Classification via kernel regression based on univariate product density estimators

Authors

  • Bezza Hafidi
  • Abdelkarim Merbouha
  • Abdallah Mkhadri
Abstract

We propose a nonparametric discrimination method based on a Nadaraya-Watson kernel regression-type estimator of the posterior probability that an incoming observation vector belongs to a given class. To overcome the curse of dimensionality of the multivariate kernel density estimate, we introduce a variance stabilizing approach which constructs independent predictor variables. The multivariate kernel estimator is then replaced by a product of univariate kernel estimators. The new procedure is illustrated on simulated data sets and a real example, confirming the usefulness of our approach.
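As a rough illustration of the idea (not the authors' implementation), the sketch below estimates the posterior class probability with a Nadaraya-Watson weight built from a product of univariate Gaussian kernels, after a whitening step that decorrelates the predictors. The whitening transform, the Gaussian kernels, the per-coordinate bandwidths, and all function names are assumptions made for this example.

```python
# Minimal sketch (assumed form): Nadaraya-Watson-type estimate of P(Y = c | x)
# using a product of univariate Gaussian kernels on decorrelated predictors.
import numpy as np

def whiten(X):
    """Center and decorrelate the columns of X (one possible variance-stabilizing step)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)              # eigendecomposition of the covariance
    W = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T  # inverse square root
    return Xc @ W, mu, W

def nw_posterior(X, y, x0, h):
    """Nadaraya-Watson estimate of P(Y = c | x0) for each class c.

    X : (n, p) training predictors (already whitened)
    y : (n,)   class labels
    x0: (p,)   query point (already whitened)
    h : (p,)   bandwidth per coordinate
    """
    u = (X - x0) / h
    weights = np.exp(-0.5 * u**2).prod(axis=1)    # product of univariate kernels
    classes = np.unique(y)
    probs = np.array([weights[y == c].sum() for c in classes])
    return classes, probs / probs.sum()

# Hypothetical usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
Z, mu, W = whiten(X)
x_new = (np.array([0.3, -0.2, 0.1]) - mu) @ W
classes, probs = nw_posterior(Z, y, x_new, h=np.full(3, 0.5))
print(dict(zip(classes, np.round(probs, 3))))
```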

Related articles

Classification via kernel product estimators

Multivariate kernel density estimation is often used as the basis for a nonparametric classification technique. However, the multivariate kernel classifier suffers from the curse of dimensionality, requiring inordinately large sample sizes to achieve a reasonable degree of accuracy in high dimensional settings. A variance stabilising approach to kernel classification can be motivated through an...

Comparison of the Gamma kernel and the orthogonal series methods of density estimation

The standard kernel density estimator suffers from a boundary bias issue for probability density functions of distributions on the positive real line. The Gamma kernel estimators and orthogonal series estimators are two alternatives which are free of boundary bias. In this paper, a simulation study is conducted to compare the small-sample performance of the Gamma kernel estimators and the orthog...
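For concreteness, the sketch below contrasts a standard Gaussian-kernel density estimate with a Gamma-kernel estimate (of the Chen type) on positive data, where the standard estimator is biased near zero. The specific kernel form, the bandwidths, and the exponential test density are assumptions for this illustration, not taken from the paper.

```python
# Minimal sketch (assumed form): standard Gaussian KDE vs. a Gamma-kernel
# density estimate for data supported on [0, inf).
import numpy as np
from scipy.stats import gamma, norm

def gaussian_kde_est(x_grid, data, h):
    """Fixed-bandwidth Gaussian kernel density estimate."""
    return norm.pdf((x_grid[:, None] - data[None, :]) / h).mean(axis=1) / h

def gamma_kde_est(x_grid, data, b):
    """Gamma-kernel estimate: at grid point x, average a Gamma(shape = x/b + 1,
    scale = b) density evaluated at the observations (no boundary bias at 0)."""
    shapes = x_grid / b + 1.0
    return np.array([gamma.pdf(data, a=a, scale=b).mean() for a in shapes])

rng = np.random.default_rng(1)
data = rng.exponential(scale=1.0, size=300)   # true density f(x) = exp(-x), x >= 0
x_grid = np.linspace(0.0, 4.0, 9)
print(np.round(gaussian_kde_est(x_grid, data, h=0.3), 3))
print(np.round(gamma_kde_est(x_grid, data, b=0.3), 3))
print(np.round(np.exp(-x_grid), 3))           # true values for comparison
```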

Variance estimation in nonparametric regression via the difference sequence method (short title: Sequence-based variance estimation)

Consider a Gaussian nonparametric regression problem having both an unknown mean function and unknown variance function. This article presents a class of difference-based kernel estimators for the variance function. Optimal convergence rates that are uniform over broad functional classes and bandwidths are fully characterized, and asymptotic normality is also established. We also show that for ...
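As an illustration of the general idea only (a first-order difference sequence, not the article's full class of estimators), the sketch below kernel-smooths halved squared differences of consecutive responses to estimate a variance function. The function name, the Gaussian weights, and the tuning values are assumptions for the example.

```python
# Minimal sketch (assumed form): difference-sequence estimate of sigma^2(x)
# by smoothing halved squared first differences of the sorted responses.
import numpy as np

def diff_based_variance(x, y, x_grid, h):
    """Pointwise variance estimate on x_grid from first differences.

    (y_{i+1} - y_i)^2 / 2 is (nearly) unbiased for the local variance when the
    mean function is smooth; Nadaraya-Watson weights turn these pseudo-
    observations into a variance-function estimate."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    d2 = 0.5 * np.diff(ys) ** 2                 # pseudo-observations of sigma^2
    xm = 0.5 * (xs[:-1] + xs[1:])               # midpoints where they are placed
    u = (x_grid[:, None] - xm[None, :]) / h
    w = np.exp(-0.5 * u**2)                     # Gaussian kernel weights
    return (w * d2).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(2)
n = 500
x = np.sort(rng.uniform(0, 1, n))
sigma = 0.2 + 0.5 * x                           # true standard deviation function
y = np.sin(2 * np.pi * x) + sigma * rng.normal(size=n)
grid = np.linspace(0.1, 0.9, 5)
print(np.round(diff_based_variance(x, y, grid, h=0.1), 3))
print(np.round((0.2 + 0.5 * grid) ** 2, 3))     # true sigma^2 for comparison
```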

Estimating a Function by Local Linear Regression when

AMS 1991 subject classifications: primary 62G05; secondary 62J99. Automated bandwidth selection methods for nonparametric regression break down in the presence of correlated errors. While this problem has been previously studied in the context of kernel regression, the results to date have only been applicable to univariate observations following an equidistant design. This article ad...

Mean Integrated Squared Error of Nonlinear Wavelet-based Estimators with Long Memory Data

We consider the nonparametric regression model with long memory data that are not necessarily Gaussian and provide an asymptotic expansion for the mean integrated squared error (MISE) of nonlinear wavelet-based mean regression function estimators. We show that this MISE expansion, when the underlying mean regression function is only piecewise smooth, is the same as the analogous expansion for the kernel...

Journal:

Volume   Issue

Pages   -

Publication date: 2005